Trace Norm Regularized Tensor Classification and Its Online Learning Approaches

Authors

  • Ziqiang Shi
  • Tieran Zheng
  • Jiqing Han
Abstract

In this paper we propose an algorithm for classifying tensor data. Our methodology builds on recent work on matrix classification with a trace-norm-constrained weight matrix and on the tensor trace norm. As in the matrix case, tensor classification is formulated as a convex optimization problem that can be solved with the off-the-shelf accelerated proximal gradient (APG) method. Unlike the matrix case, however, the proximal gradient update of the weight tensor admits no analytic solution. To tackle this problem, the Douglas-Rachford splitting technique and the alternating direction method of multipliers (ADM) used in tensor completion are adapted to update the weight tensors. Furthermore, motivated by the demands of real applications, we also propose online learning approaches. Experiments demonstrate the efficiency of the proposed methods.
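To make the contrast concrete, below is a minimal numpy sketch (not the authors' code; the helper names svt, mode_unfold, mode_fold and the threshold tau are illustrative assumptions). Singular value thresholding is the closed-form proximal step for the matrix trace norm; for a tensor, the same shrinkage can only be applied mode by mode to its unfoldings, which is the kind of subproblem that splitting schemes such as Douglas-Rachford and ADM solve repeatedly.

```python
import numpy as np

def svt(W, tau):
    """Singular value thresholding: proximal operator of tau * ||W||_* for a matrix."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def mode_unfold(T, mode):
    """Unfold tensor T along `mode` into a matrix whose rows index that mode."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_fold(M, mode, shape):
    """Inverse of mode_unfold for a tensor with the given shape."""
    rest = tuple(s for i, s in enumerate(shape) if i != mode)
    return np.moveaxis(M.reshape((shape[mode],) + rest), 0, mode)

# One shrinkage pass per mode: the building block that splitting methods
# reuse because no joint closed-form update exists for the weight tensor.
W = np.random.randn(5, 6, 7)   # toy weight tensor
tau = 0.1                      # hypothetical shrinkage threshold
shrunk = [mode_fold(svt(mode_unfold(W, m), tau), m, W.shape) for m in range(W.ndim)]
```

Reconciling the differently shrunk unfoldings back into a single weight tensor is what the auxiliary variables of the splitting iterations are for; the exact update rules are given in the paper.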

Similar Papers

Regularized Orthogonal Tensor Decompositions for Multi-Relational Learning

Multi-relational learning has received lots of attention from researchers in various research communities. Most existing methods either suffer from superlinear per-iteration cost, or are sensitive to the given ranks. To address both issues, we propose a scalable core tensor trace norm Regularized Orthogonal Iteration Decomposition (ROID) method for full or incomplete tensor analytics, which can...

Theoretical and Experimental Analyses of Tensor-Based Regression and Classification

We theoretically and experimentally investigate tensor-based regression and classification. Our focus is regularization with various tensor norms, including the overlapped trace norm, the latent trace norm, and the scaled latent trace norm. We first give dual optimization methods using the alternating direction method of multipliers, which are computationally efficient when the number of training...

Trace Norm Regularization: Reformulations, Algorithms, and Multi-Task Learning

We consider a recently proposed optimization formulation of multi-task learning based on trace norm regularized least squares. While this problem may be formulated as a semidefinite program (SDP), its size is beyond general SDP solvers. Previous solution approaches apply proximal gradient methods to solve the primal problem. We derive new primal and dual reformulations of this problem, including...

Scalable and Sound Low-Rank Tensor Learning

Many real-world data arise naturally as tensors. Equipped with a low rank prior, learning algorithms can benefit from exploiting the rich dependency encoded in a tensor. Despite its prevalence in low-rank matrix learning, trace norm ceases to be tractable in tensors and therefore most existing works resort to matrix unfolding. Although some theoretical guarantees are available, these approaches...

A Dual Framework for Low-rank Tensor Completion

One of the popular approaches for low-rank tensor completion is to use the latent trace norm as a low-rank regularizer. However, most of the existing works learn a sparse combination of tensors. In this work, we fill this gap by proposing a variant of the latent trace norm which helps to learn a non-sparse combination of tensors. We develop a dual framework for solving the problem of latent trace...

Journal:
  • CoRR

Volume: abs/1109.1342  Issue: -

Pages: -

Publication date: 2011